Proximal Mapping for Symmetric Penalty and Sparsity
Authors
Abstract
This paper studies a class of problems consisting of minimizing a continuously differentiable function penalized with the so-called ℓ0-norm over a symmetric set. These problems are hard to solve, yet prominent in many fields and applications. We first study the proximal mapping with respect to the ℓ0-norm over symmetric sets, and provide an efficient method to attain it. The method is then improved for symmetric sets satisfying a submodularity-like property, which we call "second order monotonicity" (SOM). It is shown that many important symmetric sets, such as the ℓ1, ℓ2 and ℓ∞ balls, the simplex and the full simplex, satisfy this SOM property. We then develop, under the validity of the SOM property, necessary optimality conditions, and corresponding algorithms that are guaranteed to converge to points satisfying the aforementioned optimality conditions. We prove the existence of a hierarchy between the optimality conditions, and consequently between the corresponding algorithms.
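For orientation, the one symmetric set over which this proximal computation is fully separable is the ℓ∞-ball, and the sketch below works out that special case only: each coordinate is either set to zero or clipped into [-r, r] at the cost of the ℓ0 penalty. The radius r and penalty weight lam are placeholder parameters, and this is an illustrative baseline, not the paper's general method for coupled sets such as the ℓ1/ℓ2-balls or the simplex, where the coordinates interact.

```python
import numpy as np

def prox_l0_over_linf_ball(v, lam, r):
    """Minimal sketch (illustrative assumption: the l-infinity ball of radius r).

    Computes  argmin_z  lam * ||z||_0 + 0.5 * ||z - v||^2   s.t.  ||z||_inf <= r.
    The problem separates across coordinates: each z_i is either 0
    (cost 0.5*v_i^2) or the clipped value clip(v_i, -r, r)
    (cost lam + 0.5*(v_i - clip)^2); we keep whichever is cheaper.
    """
    clipped = np.clip(v, -r, r)
    cost_zero = 0.5 * v**2
    cost_keep = lam + 0.5 * (v - clipped)**2
    return np.where(cost_keep < cost_zero, clipped, 0.0)

# Example: when the radius is inactive the rule reduces to classical hard
# thresholding (keep v_i iff |v_i| > sqrt(2*lam)).
v = np.array([0.3, -1.5, 0.05, 2.4])
print(prox_l0_over_linf_ball(v, lam=0.5, r=2.0))   # -> [ 0.  -1.5  0.   2. ]
```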
Similar resources
Tree-guided Group Lasso for Multi-response Regression with Structured Sparsity, with an Application to eQTL Mapping, by Seyoung Kim
We consider the problem of estimating a sparse multi-response regression function, with an application to expression quantitative trait locus (eQTL) mapping, where the goal is to discover genetic variations that influence gene-expression levels. In particular, we investigate a shrinkage technique capable of capturing a given hierarchical structure over the responses, such as a hierarchical clus...
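To make the hierarchical structure concrete, the sketch below evaluates a generic tree-structured group-lasso penalty, summing w_g * ||B[j, g]||_2 over inputs j and tree nodes g, where each group is the set of responses under one node of the clustering tree; the groups, node weights and data are hypothetical placeholders and the specific weighting scheme of the cited paper is not reproduced.

```python
import numpy as np

def tree_group_lasso_penalty(B, groups, weights):
    """Sketch of a tree-structured group-lasso penalty on a coefficient matrix B
    (rows = inputs, columns = responses).  Each group is the set of response
    indices under one node of a hierarchical clustering tree, and the penalty is
    sum over inputs j and tree nodes g of  w_g * ||B[j, g]||_2.
    """
    return sum(w * np.linalg.norm(B[j, list(g)])
               for j in range(B.shape[0])
               for g, w in zip(groups, weights))

# Hypothetical 4-response hierarchy: the leaves, two internal nodes, and the root.
groups  = [(0,), (1,), (2,), (3,), (0, 1), (2, 3), (0, 1, 2, 3)]
weights = [1.0, 1.0, 1.0, 1.0, 0.7, 0.7, 0.5]   # placeholder node weights
B = np.random.randn(5, 4)                        # 5 inputs, 4 responses
print(tree_group_lasso_penalty(B, groups, weights))
```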
Optimization Problems Involving Group Sparsity Terms
This paper studies a general form problem in which a lower bounded continuously differentiable function is minimized over a block separable set incorporating a group sparsity expression as a constraint or a penalty (or both) in the group sparsity setting. This class of problems is generally hard to solve, yet highly applicable in numerous practical settings. Particularly, we study the proximal ...
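As a point of reference for the group-sparsity proximal mapping mentioned here, the unconstrained case reduces to a per-group hard-thresholding rule; the sketch below shows only that baseline (no block-separable constraint set), with a hypothetical non-overlapping grouping and penalty weight lam.

```python
import numpy as np

def prox_group_l0(v, groups, lam):
    """Sketch of the unconstrained proximal mapping of  lam * (#nonzero groups):
    argmin_z  lam * sum_g 1[z_g != 0] + 0.5 * ||z - v||^2.
    Per (non-overlapping) group g: keep v_g iff 0.5 * ||v_g||^2 > lam, else zero it.
    """
    z = np.zeros_like(v)
    for g in groups:
        idx = list(g)
        if 0.5 * np.dot(v[idx], v[idx]) > lam:
            z[idx] = v[idx]
    return z

# Hypothetical example with two groups of three coordinates each.
v = np.array([0.2, -0.1, 0.3, 1.0, -2.0, 0.5])
print(prox_group_l0(v, groups=[(0, 1, 2), (3, 4, 5)], lam=0.5))
# -> first group zeroed (0.5*0.14 < 0.5), second group kept (0.5*5.25 > 0.5)
```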
Smoothing Proximal Gradient Method for General Structured Sparse Regression
We study the problem of estimating high dimensional regression models regularized by a structured sparsity-inducing penalty that encodes prior structural information on either the input or output variables. We consider two widely adopted types of penalties of this kind as motivating examples: 1) the general overlapping-group-lasso penalty, generalized from the group-lasso penalty; and 2) the gr...
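The "smoothing" in this line of work refers to Nesterov-type smoothing of the non-separable structured penalty so that a plain (accelerated) gradient scheme can be applied. The sketch below shows just that ingredient for an overlapping group-ℓ2 penalty, sum_g w_g * ||beta_g||_2: the smoothed value and its gradient come from per-group projections onto the unit ℓ2 ball. The group structure, weights and smoothing parameter mu are placeholder assumptions, not the cited paper's exact construction.

```python
import numpy as np

def smoothed_group_penalty(beta, groups, weights, mu):
    """Sketch of Nesterov smoothing of  Omega(beta) = sum_g w_g * ||beta_g||_2
    (groups may overlap).  Writing Omega(beta) = max_{||a_g||_2 <= 1} sum_g w_g * a_g . beta_g,
    the smoothed penalty subtracts (mu/2)*||a||^2; the maximizer is the projection
    a_g* = proj_{unit l2 ball}(w_g * beta_g / mu), and the gradient adds w_g * a_g*
    on the coordinates of each group.
    """
    value, grad = 0.0, np.zeros_like(beta)
    for g, w in zip(groups, weights):
        idx = list(g)
        a = w * beta[idx] / mu
        norm = np.linalg.norm(a)
        if norm > 1.0:                  # project onto the unit l2 ball
            a = a / norm
        value += w * np.dot(a, beta[idx]) - 0.5 * mu * np.dot(a, a)
        grad[idx] += w * a
    return value, grad

# Placeholder example: two overlapping groups over 4 coefficients.
beta = np.array([1.0, -0.5, 0.2, 0.0])
val, grad = smoothed_group_penalty(beta, groups=[(0, 1, 2), (1, 2, 3)],
                                   weights=[1.0, 1.0], mu=0.1)
print(val, grad)   # smoothed penalty value and a gradient usable in a gradient step
```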